Bayesian optimization under mixed constraints with a slack-variable augmented Lagrangian
Authors
Abstract
An augmented Lagrangian (AL) can convert a constrained optimization problem into a sequence of simpler (e.g., unconstrained) problems, which are then usually solved with local solvers. Recently, surrogate-based Bayesian optimization (BO) sub-solvers have been successfully deployed in the AL framework for a more global search in the presence of inequality constraints; however, a drawback was that expected improvement (EI) evaluations relied on Monte Carlo approximation. Here we introduce an alternative slack-variable AL, and show that in this formulation the EI may be evaluated with library routines. The slack variables furthermore accommodate equality as well as inequality constraints, and mixtures thereof. We show how our new slack “ALBO” compares favorably to the original. Its superiority over conventional alternatives is reinforced on several mixed-constraint examples.
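As a hedged sketch of the slack-variable AL the abstract alludes to (the symbols f, c_j, s_j, lambda_j, rho and the 1/(2 rho) penalty convention below are assumptions for illustration, not notation quoted from the paper body): with objective f and constraints c_j(x) <= 0 (inequalities) or c_j(x) = 0 (equalities), an augmented Lagrangian over the decision variables x and slacks s can be written as

\[
  L_A(x, s;\, \lambda, \rho)
    \;=\; f(x)
    \;+\; \sum_{j} \lambda_j \bigl(c_j(x) + s_j\bigr)
    \;+\; \frac{1}{2\rho} \sum_{j} \bigl(c_j(x) + s_j\bigr)^{2},
  \qquad s_j \ge 0,
\]

with s_j >= 0 for inequality constraints and s_j fixed to zero for equalities. Because this composite is linear plus quadratic in the (Gaussian) surrogate outputs, its expected improvement can be evaluated with standard library routines rather than by Monte Carlo, which is the gain the abstract highlights.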
Similar references
Some Theoretical Properties of an Augmented Lagrangian Merit Function
Sequential quadratic programming (SQP) methods for nonlinearly constrained optimization typically use a merit function to enforce convergence from an arbitrary starting point. We define a smooth augmented Lagrangian merit function in which the Lagrange multiplier estimate is treated as a separate variable, and inequality constraints are handled by means of non-negative slack variables that are ...
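As a hedged illustration of the construction this snippet describes (one standard presentation from the SQP literature; the exact form and per-constraint penalty weighting in the cited paper may differ), the merit function treats the multiplier estimate lambda and the non-negative slacks s as variables alongside x:

\[
  M_{\rho}(x, \lambda, s) \;=\; f(x) \;-\; \lambda^{\top}\bigl(c(x) - s\bigr)
    \;+\; \tfrac{\rho}{2}\,\bigl\lVert c(x) - s \bigr\rVert^{2},
  \qquad s \ge 0,
\]

with line searches taken jointly in (x, lambda, s).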
A note on exploiting structure when using slack variables
We show how to exploit the structure inherent in the linear algebra for constrained nonlinear optimization problems when inequality constraints have been converted to equations by adding slack variables and the problem is solved using an augmented Lagrangian method. AMS Subject Classification: 65K05, 90C30
Alternating direction augmented Lagrangian methods for semidefinite programming
We present an alternating direction method based on an augmented Lagrangian framework for solving semidefinite programming (SDP) problems in standard form. At each iteration, the algorithm, also known as a two-splitting scheme, minimizes the dual augmented Lagrangian function sequentially with respect to the Lagrange multipliers corresponding to the linear constraints, then the dual slack varia...
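A hedged sketch of the kind of two-splitting scheme described here (the notation, signs, and scalings below are assumptions, not quoted from the cited paper): for the standard-form dual problem max { b^T y : A*(y) + S = C, S PSD }, one forms a dual augmented Lagrangian L_mu(y, S; X) = -b^T y + <X, A*(y) + S - C> + (1/(2 mu)) ||A*(y) + S - C||_F^2 and alternates

\[
\begin{aligned}
  y^{k+1} &\in \arg\min_{y}\ \mathcal{L}_{\mu}\bigl(y, S^{k}; X^{k}\bigr) &&\text{(a linear solve)},\\
  S^{k+1} &\in \arg\min_{S \succeq 0}\ \mathcal{L}_{\mu}\bigl(y^{k+1}, S; X^{k}\bigr) &&\text{(a projection onto the PSD cone)},\\
  X^{k+1} &= X^{k} + \tfrac{1}{\mu}\bigl(\mathcal{A}^{*}(y^{k+1}) + S^{k+1} - C\bigr) &&\text{(multiplier update)}.
\end{aligned}
\]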
Copositive Relaxation Beats Lagrangian Dual Bounds in Quadratically and Linearly Constrained Quadratic Optimization Problems
We study non-convex quadratic minimization problems under (possibly non-convex) quadratic and linear constraints, and characterize both Lagrangian and Semi-Lagrangian dual bounds in terms of conic optimization. While the Lagrangian dual is equivalent to the SDP relaxation (which has been known for quite a while, although the presented form, explicitly incorporating linear constraints, seems to ...
Efficient and Adaptive Lagrange-Multiplier Methods for Nonlinear Continuous Global Optimization
Lagrangian methods are popular in solving continuous constrained optimization problems. In this paper, we address three important issues in applying Lagrangian methods to solve optimization problems with inequality constraints. First, we study methods to transform inequality constraints into equality constraints. An existing method, called the slack-variable method, adds a slack variable to eac...
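The truncated sentence refers to the standard slack-variable construction; as a brief reminder (notation assumed), an inequality constraint can be rewritten as an equality either with a non-negative slack or with a squared slack:

\[
  g_j(x) \le 0
  \quad\Longleftrightarrow\quad
  g_j(x) + s_j = 0,\ \ s_j \ge 0,
  \qquad\text{or}\qquad
  g_j(x) + s_j^{2} = 0 .
\]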